The joint quantum entropy generalizes the classical joint entropy to the context of quantum information theory. Intuitively, given two quantum states <math>\rho</math> and <math>\sigma</math>, represented as density operators that are subparts of a quantum system, the joint quantum entropy is a measure of the total uncertainty or entropy of the joint system. It is written <math>S(\rho,\sigma)</math> or <math>H(\rho,\sigma)</math>, depending on the notation being used for the von Neumann entropy. Like other entropies, the joint quantum entropy is measured in bits, i.e. the logarithm is taken in base 2. In this article, we will use <math>S(\rho,\sigma)</math> for the joint quantum entropy.

==Background==
In information theory, for any classical random variable <math>X</math>, the classical Shannon entropy <math>H(X)</math> is a measure of how uncertain we are about the outcome of <math>X</math>. For example, if <math>X</math> is a probability distribution concentrated at one point, the outcome of <math>X</math> is certain and therefore its entropy <math>H(X)=0</math>. At the other extreme, if <math>X</math> is the uniform probability distribution with <math>n</math> possible values, intuitively one would expect <math>X</math> to be associated with the most uncertainty. Indeed, such uniform probability distributions have the maximum possible entropy <math>H(X) = \log_2(n)</math>.

In quantum information theory, the notion of entropy is extended from probability distributions to quantum states, or density matrices. For a state <math>\rho</math>, the von Neumann entropy is defined by

:<math>S(\rho) = -\operatorname{Tr} \rho \log \rho .</math>

Applying the spectral theorem, or Borel functional calculus for infinite-dimensional systems, we see that it generalizes the classical entropy. The physical meaning remains the same. A maximally mixed state, the quantum analog of the uniform probability distribution, has maximum von Neumann entropy. On the other hand, a pure state, or a rank one projection, has zero von Neumann entropy. We write the von Neumann entropy <math>S(\rho)</math> (or sometimes <math>H(\rho)</math>).
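As a concrete numerical check of the two extreme cases above, the following sketch (assuming NumPy; the function name <code>von_neumann_entropy</code> is illustrative, not part of any standard library) computes <math>S(\rho)</math> in bits from the spectrum of a density matrix, since the von Neumann entropy reduces to the Shannon entropy of the eigenvalues.

<syntaxhighlight lang="python">
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho), in bits.

    Computed from the eigenvalues of the density matrix, since S(rho)
    equals the Shannon entropy of its spectrum.
    """
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]   # drop zero eigenvalues (0 log 0 = 0)
    return float(-np.sum(eigvals * np.log2(eigvals)))

# Pure state |0><0|: a rank-one projection, entropy 0
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])

# Maximally mixed qubit state I/2: maximum entropy for a qubit, 1 bit
mixed = np.eye(2) / 2

print(von_neumann_entropy(pure))    # ~0.0
print(von_neumann_entropy(mixed))   # 1.0
</syntaxhighlight>

For the maximally mixed state on a <math>d</math>-dimensional system, the same function returns <math>\log_2(d)</math>, matching the classical uniform-distribution case.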